Rank-Adaptive Time Integration of Tree Tensor Networks

Authors

Abstract

A rank-adaptive integrator for the approximate solution of high-order tensor differential equations by tree tensor networks is proposed and analyzed. In a recursion from the leaves to the root, the integrator updates bases and then evolves connection tensors by a Galerkin method in the augmented subspace spanned by the new and old bases. This is followed by rank truncation within a specified error tolerance. The memory requirements are linear in the order of the tensor and linear in the maximal mode dimension. The integrator is robust to small singular values of matricizations of the connection tensors. Up to the rank-truncation error, which is controlled by the given tolerance, the integrator preserves norm and energy for Schrödinger equations, and it dissipates the energy in gradient systems. Numerical experiments with a basic quantum spin system illustrate the behavior of the algorithm.
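The rank truncation mentioned in the abstract can be illustrated with an SVD-based sketch: keep the smallest rank for which the Frobenius norm of the discarded singular values stays below the tolerance. The function name and interface below are illustrative assumptions, not the paper's implementation.

```python
import numpy as np

def truncate_rank(A, tol):
    """Truncate a matricization A to the smallest rank r such that the
    norm of the discarded singular values is at most tol."""
    U, s, Vt = np.linalg.svd(A, full_matrices=False)
    # cumulative norm of the trailing singular values, computed from the back
    tail = np.sqrt(np.cumsum(s[::-1] ** 2))[::-1]
    # smallest r whose discarded tail satisfies the tolerance; keep at least rank 1
    r = max(1, int(np.sum(tail > tol)))
    return U[:, :r], s[:r], Vt[:r, :]
```

Applied to each connection tensor after the augmented Galerkin step, this kind of truncation is what keeps the ranks adaptive while bounding the local error by the prescribed tolerance.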


Similar articles

Time Integration of Tensor Trains

A robust and efficient time integrator for dynamical tensor approximation in the tensor train or matrix product state format is presented. The method is based on splitting the projector onto the tangent space of the tensor manifold. The algorithm can be used for updating time-dependent tensors in the given data-sparse tensor train / matrix product state format and for computing an approximate s...

Tensor Regression Networks with various Low-Rank Tensor Approximations

Tensor regression networks achieve a high rate of compression of model parameters in multilayer perceptrons (MLPs) while having only a slight impact on performance. The tensor regression layer, which replaces the flattening operation of a traditional MLP, imposes low-rank constraints on its weights. We investigate tensor regression networks using various low-rank tensor approximations, aiming to...

Fast Tree-Structured Recursive Neural Tensor Networks

In this project we explore different ways to optimize the computation involved in training a tree-structured RNTN, in particular batching techniques that combine many matrix-vector multiplications into matrix-matrix multiplications, and many tensor-vector operations into tensor-matrix operations. We assume that training is performed using the mini-batch AdaGrad algorithm, and explore how we can...

Nonlinear Adaptive Algorithms on Rank-One Tensor Models

This work proposes a low-complexity nonlinearity model and develops adaptive algorithms over it. The model is based on decomposable (rank-one, in tensor language) Volterra kernels. It may also be described as a product of FIR filters, which explains its low complexity. The rank-one model is also interesting because it comes from a well-posed problem in approximation theory. The paper uses...

Beyond Low Rank: A Data-Adaptive Tensor Completion Method

Low rank tensor representation underpins much of recent progress in tensor completion. In real applications, however, this approach is confronted with two challenging problems, namely (1) tensor rank determination; (2) handling real tensor data which only approximately fulfils the low-rank requirement. To address these two issues, we develop a data-adaptive tensor completion model which explici...


Journal

Journal title: SIAM Journal on Numerical Analysis

Year: 2023

ISSN: 0036-1429, 1095-7170

DOI: https://doi.org/10.1137/22m1473790